Single-Model Uncertainties for Deep Learning

Tagasovska, Natasa, Lopez-Paz, David

Neural Information Processing Systems

We provide single-model estimates of aleatoric and epistemic uncertainty for deep neural networks. To estimate aleatoric uncertainty, we propose Simultaneous Quantile Regression (SQR), a loss function to learn all the conditional quantiles of a given target variable. These quantiles can be used to compute well-calibrated prediction intervals. To estimate epistemic uncertainty, we propose Orthonormal Certificates (OCs), a collection of diverse non-constant functions that map all training samples to zero. These certificates map out-of-distribution examples to non-zero values, signaling epistemic uncertainty. Our uncertainty estimators are computationally attractive, as they do not require ensembling or retraining deep models, and achieve state-of-the-art performance.
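The pinball loss at the heart of SQR, and the random sampling of quantile levels during training, can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code; `pinball_loss` is a name chosen here for exposition.

```python
import numpy as np

def pinball_loss(y, y_hat, tau):
    """Pinball (quantile) loss at level tau in (0, 1): under-predictions
    are weighted by tau and over-predictions by (1 - tau), so the
    minimizer of its expectation is the tau-th conditional quantile."""
    diff = y - y_hat
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

# SQR trains one model f(x, tau) by drawing a fresh quantile level
# tau ~ Uniform(0, 1) at every training step, so a single network
# learns all conditional quantiles simultaneously.
rng = np.random.default_rng(0)
tau = rng.uniform(0.0, 1.0)

# At tau = 0.5 the pinball loss reduces to half the absolute error.
print(pinball_loss(np.array([1.0, 2.0]), np.array([1.5, 1.5]), 0.5))  # 0.25
```

A (1 - alpha) prediction interval then comes from evaluating the same trained model at the alpha/2 and 1 - alpha/2 quantile levels.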


Reviews: Single-Model Uncertainties for Deep Learning

Neural Information Processing Systems

This work presents ways to obtain estimates of the aleatoric and epistemic uncertainties of deep neural networks. The aleatoric uncertainty is estimated by learning the quantiles of the target variable via Simultaneous Quantile Regression (SQR), which minimizes the pinball loss with a target quantile sampled at random in every training iteration. The epistemic uncertainty is implicitly estimated by Orthonormal Certificates (OCs): functions trained to map in-distribution examples to zero and out-of-distribution examples to non-zero values. The authors also provide tail bounds for the OCs in the case of Gaussian input data, which offers some intuition about their behaviour. Simplicity is a benefit of these estimators, and the authors demonstrate their performance on regression and classification tasks.
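The certificate construction described in the review can be sketched with plain NumPy on fixed feature vectors. This is an illustrative toy under stated assumptions — linear certificates on top of given features, fit by simple gradient descent, with function names invented here; the paper applies the idea on top of a deep network's learned features.

```python
import numpy as np

def fit_certificates(features, k, n_steps=500, lr=0.1, lam=1.0):
    """Fit k linear certificates C (d x k) by gradient descent on
    mean ||features @ C||^2 + lam * ||C^T C - I||_F^2: certificates
    should map training features near zero while staying diverse
    (approximately orthonormal), so they cannot collapse to C = 0."""
    n, d = features.shape
    rng = np.random.default_rng(0)
    C = rng.standard_normal((d, k)) / np.sqrt(d)
    I = np.eye(k)
    for _ in range(n_steps):
        grad_fit = 2.0 * features.T @ (features @ C) / n
        grad_orth = 4.0 * lam * C @ (C.T @ C - I)
        C -= lr * (grad_fit + grad_orth)
    return C

def certificate_score(C, phi):
    """Epistemic score ||C^T phi||: near zero for in-distribution
    features, larger for out-of-distribution ones."""
    return np.linalg.norm(phi @ C, axis=-1)

# Toy data: training features occupy only the first 2 of 5 dimensions,
# so the certificates can learn the 3-dimensional orthogonal complement.
rng = np.random.default_rng(1)
X = np.hstack([rng.standard_normal((200, 2)), np.zeros((200, 3))])
C = fit_certificates(X, k=3)

in_dist = np.array([1.0, -1.0, 0.0, 0.0, 0.0])  # lies in the training subspace
out_dist = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # leaves the training subspace
# certificate_score(C, in_dist) is near 0; certificate_score(C, out_dist) is not.
```

Thresholding the score then gives a filter: examples whose certificate score exceeds the largest scores seen on held-out in-distribution data are flagged as out-of-distribution.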


Reviews: Single-Model Uncertainties for Deep Learning

Neural Information Processing Systems

The paper presents an approach for estimating aleatoric uncertainty that leverages the pinball loss from quantile regression, and orthonormal certificates for measuring epistemic uncertainty. Although the pinball loss has been used in prior work, its randomized version for simultaneously optimizing over all quantiles is novel. In addition, the novelty of the OC approach for the filtering task is significant. Overall, the value of the proposed methods is convincingly demonstrated on a variety of datasets. The reviewers and AC have carefully examined the author feedback and feel that it adequately addresses the concerns raised by the reviewers.


